
    Environmental exposure to metallic soil elements and risk of cancer in the UK population, using a unique linkage between THIN and BGS databases

    Background: Many epidemiological studies have examined the influence of exposure to highly toxic elements on cancer risk in the workplace, mainly in certain occupational groups, or in populations near industrial sources. Toxic elements such as arsenic, copper, nickel, and uranium have been shown to increase the risk of several different types of cancer in these highly exposed groups. Many of these elements occur naturally in the soil, yet the health impact of such environmental exposures on the general population has received little attention to date, possibly because soil concentrations of these elements are believed to be too low to cause harm. The long-term effect of chronic exposure to metals in the soil therefore remains unclear. Aims and objectives: The goal was to use a new resource, THIN-GBASE (a linkage of the THIN primary care database with BGS geochemical data), to conduct a series of environmental epidemiological studies testing the hypothesis that basal cell carcinoma (BCC), lung and gastrointestinal tract (GIT) cancers are associated with high exposure to certain low-level metals in soil. We also sought to use this resource to determine which soil metals should be tested as predictors of each cancer outcome. Methods: For BCC, an ecological study was first undertaken to assess overall regional variation in BCC and provide national, contemporary breakdowns of incidence rates across the UK. The primary exposure of interest for BCC was low-level soil arsenic, so soil arsenic exposure levels were quantified against the UK national safety limit for arsenic (As-C4SL = 35 mg/kg). A population-based cohort study was then conducted to quantify the risks associated with developing BCC at increasing levels of exposure to soil arsenic. 
For lung cancer, a two-stage process was adopted: 1) a data mining analysis using a correlation-based filter selection model identified the restricted set of soil metals that best predicted lung cancer; and 2) a prospective cohort study was used in which these elements were fitted together (adjusted for confounding variables) in a multivariable Cox proportional-hazards model to determine the risks associated with developing lung cancer at increasing levels of exposure to each element. For GIT cancers, a three-stage process was adopted: stages 1 and 2 followed the methodology of the lung cancer study. In stage 3, all GIT cancers were divided into three broader outcomes, i.e. upper GIT (mouth and oesophagus), stomach (standalone) and colorectal (small and large intestine, rectum and anal canal) cancers. A multivariate competing-risks survival model, treating the three GIT cancer groups as competing events, was used to identify associations between the metals selected in stage 1 and GIT-specific cancers. Results: For BCC, the ecological study showed that the overall European and World age-standardised rates (EASR and WASR) for BCC in the UK were 98.6 and 66.9 per 100,000 person-years, respectively. There was large geographical variation in age-sex standardised incidence of BCC, with the South East having the highest incidence (202.7/100,000 person-years), followed by South Central (193.5/100,000 person-years) and Wales (185.7/100,000 person-years). Incidence rates of BCC were substantially higher in the least socioeconomically deprived groups, with increasing levels of deprivation associated with decreasing rates of BCC (p < 0.001). Across age groups, the largest annual increase was observed among those aged 30-49 years. 
Assessment of soil arsenic indicated that individuals living in areas with concentrations ≥35 mg/kg had a significantly increased hazard of developing BCC (35-70 mg/kg: adjusted HR 1.08, 95% CI 1.02-1.14; ≥70 mg/kg: adjusted HR 1.17, 95% CI 1.09-1.28). Urban residents with the highest exposure to soil arsenic had the greatest risk of developing BCC (≥70.0 mg/kg: HR 1.18, 95% CI 1.06-1.36). For lung cancer, the correlation-based filter selection model identified aluminium, lead and uranium as the appropriate set of exposures for modelling lung cancer risk. The fully adjusted hazards model showed evidence of an increased risk of developing lung cancer only for elevated concentrations of soil aluminium at medium levels (47,000-61,600 mg/kg). Urban residents with the highest exposure to soil aluminium had the greatest risk of developing lung cancer (≥61,600 mg/kg: HR 1.12, 95% CI 1.04-1.22). For GIT cancers, the correlation-based filter selection model identified seven elements (aluminium, phosphorus, zinc, uranium, calcium, manganese, and lead) as the appropriate set of exposures for predicting GIT cancer risk. The fully adjusted hazards model indicated that the risk of developing overall GIT cancers was significantly associated with elevated exposure to soil phosphorus only (873-1,127 mg/kg: HR 1.08, 95% CI 1.02-1.14; 1,127-1,456 mg/kg: HR 1.07, 95% CI 1.01-1.13; and ≥1,456 mg/kg: HR 1.07, 95% CI 1.01-1.13). No consistent relationships were identified between any of the selected elements and the GIT-specific cancer outcomes when the different GIT cancers were treated as competing events. Conclusion: There is slight evidence of increased BCC, respiratory and GIT cancer risk with elevated exposure to soil arsenic, aluminium and phosphorus, respectively. 
The series of investigations conducted for this research is among the first contemporary UK-based studies to present novel estimates for this ill-defined group of pollutants. This research demonstrates that linking geochemical data with electronic primary care medical records can be a valuable approach for assessing whether long-term exposure to low-level soil contaminants has health consequences in the population.
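The correlation-based filter selection step used in the lung and GIT cancer analyses ranks feature subsets by a merit score that rewards correlation with the outcome and penalises redundancy between features. A minimal sketch with synthetic data (the elements and effect sizes here are illustrative stand-ins, not the study's data):

```python
import numpy as np

def cfs_merit(X, y):
    """Merit score used by correlation-based filter selection (CFS):
    subsets whose features correlate with the outcome but not with
    each other score highest."""
    k = X.shape[1]
    # mean absolute feature-outcome correlation
    r_cf = np.mean([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(k)])
    if k == 1:
        return r_cf
    # mean absolute pairwise feature-feature correlation
    corr = np.abs(np.corrcoef(X, rowvar=False))
    r_ff = (corr.sum() - k) / (k * (k - 1))
    return k * r_cf / np.sqrt(k + k * (k - 1) * r_ff)

# synthetic stand-ins: one soil element that drives the outcome, one pure noise
rng = np.random.default_rng(0)
n = 500
element_a = rng.normal(size=n)                      # hypothetical predictive element
element_b = rng.normal(size=n)                      # hypothetical irrelevant element
outcome = (element_a + rng.normal(scale=0.5, size=n) > 0).astype(float)

merit_a = cfs_merit(element_a[:, None], outcome)
merit_ab = cfs_merit(np.column_stack([element_a, element_b]), outcome)
print(merit_a > merit_ab)  # adding an irrelevant element dilutes the merit
```

The selected subset (the one maximising this merit) is what then enters the Cox proportional-hazards model as the restricted set of exposures.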

    Health status of returning refugees, internally displaced persons, and the host community in a post-conflict district in northern Sri Lanka: a cross-sectional survey.

    BACKGROUND: Although the adverse impacts of conflict-driven displacement on health are well-documented, less is known about how health status and associated risk factors differ according to displacement experience. This study quantifies health status and quality of life among returning refugees, internally displaced persons, and the host community in a post-conflict district in Northern Sri Lanka, and explores associated risk factors. METHODS: We analysed data collected through a household survey (n = 570) in Vavuniya district, Sri Lanka. The effects of displacement status and other risk factors on perceived quality of life (estimated from the 36-item Short Form Questionnaire), mental health status (from the 9-item Patient Health Questionnaire), and self-reported chronic disease status were examined using univariable analyses and multivariable regressions. RESULTS: We found strong evidence that perceived quality of life was significantly lower for internally displaced persons than for the host community and returning refugees, after adjusting for covariates. Neither mental health status nor chronic disease status varied markedly among the groups, suggesting that other risk factors may be more important determinants of these outcomes. CONCLUSIONS: Our study provides important insights into the overall health and well-being of the different displaced sub-populations in a post-conflict setting. The findings reinforce existing evidence on the relationship between displacement and health but also highlight gaps in research on the long-term health effects of prolonged displacement. Understanding the heterogeneity of conflict-affected populations has important implications for effective and equitable humanitarian service delivery in a post-conflict setting.

    Call detail record aggregation methodology impacts infectious disease models informed by human mobility

    This paper demonstrates how two different methods used to calculate population-level mobility from Call Detail Records (CDR) produce varying predictions of the spread of epidemics informed by these data. Our findings are based on one CDR dataset describing inter-district movement in Ghana in 2021, produced using two different aggregation methodologies. One methodology, "all pairs," is designed to retain long distance network connections while the other, "sequential" methodology is designed to accurately reflect the volume of travel between locations. We show how the choice of methodology feeds through models of human mobility to the predictions of a metapopulation SEIR model of disease transmission. We also show that this impact varies depending on the location of pathogen introduction and the transmissibility of infections. For central locations or highly transmissible diseases, we do not observe significant differences between aggregation methodologies on the predicted spread of disease. For less transmissible diseases or those introduced into remote locations, we find that the choice of aggregation methodology influences the speed of spatial spread as well as the size of the peak number of infections in individual districts. Our findings can help researchers and users of epidemiological models to understand how methodological choices at the level of model inputs may influence the results of models of infectious disease transmission, as well as the circumstances in which these choices do not alter model predictions
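A metapopulation SEIR model of the kind described couples district-level epidemics through a mobility matrix, so the CDR aggregation choice enters the model directly through that matrix. The following is a minimal deterministic sketch with assumed toy parameters and a hypothetical three-district hub-and-spoke network, not the paper's calibrated model:

```python
import numpy as np

def seir_metapop(M, beta, sigma, gamma, N, seed_loc, days=365, dt=1.0):
    """Deterministic metapopulation SEIR. M[i, j] is the fraction of
    residents of district i who mix in district j each day (rows sum
    to 1); the mobility matrix drives the force of infection."""
    k = len(N)
    S, E, I, R = N.astype(float).copy(), np.zeros(k), np.zeros(k), np.zeros(k)
    S[seed_loc] -= 1.0
    I[seed_loc] += 1.0
    peak = np.zeros(k)
    for _ in range(int(days / dt)):
        I_eff = M.T @ I                      # infectious people present at each destination
        N_eff = M.T @ N                      # effective population at each destination
        lam = beta * (M @ (I_eff / N_eff))   # per-capita force of infection at home district
        dS = -lam * S
        dE = lam * S - sigma * E
        dI = sigma * E - gamma * I
        dR = gamma * I
        S += dt * dS; E += dt * dE; I += dt * dI; R += dt * dR
        peak = np.maximum(peak, I)
    return peak, R

# toy network: one hub and two spokes that only connect via the hub
# (populations, mixing fractions and rates are illustrative only)
N = np.array([1e6, 2e5, 2e5])
M = np.array([[0.90, 0.05, 0.05],
              [0.20, 0.80, 0.00],
              [0.20, 0.00, 0.80]])
peak, final = seir_metapop(M, beta=0.5, sigma=1/5, gamma=1/7, N=N, seed_loc=0)
print(final[1] > 1.0 and final[2] > 1.0)  # infection spreads from the hub to both spokes
```

Swapping in an "all pairs" versus a "sequential" estimate of M, while holding everything else fixed, is exactly the comparison the paper describes: the epidemic curves per district will differ whenever the two matrices disagree about long-distance connections.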

    Resource availability and capacity to implement multi-stranded cholera interventions in the north-east region of Nigeria

    Background: Limited healthcare facility (HCF) resources and capacity to implement multi-stranded cholera interventions (water, sanitation, and hygiene (WASH), surveillance, case management, and community engagement) can hinder the actualisation of the global strategic roadmap goals for cholera control, especially in settings made fragile by armed conflicts, such as the north-east region of Nigeria. Therefore, we aimed to assess HCF resource availability and capacity to implement these cholera interventions in Adamawa and Bauchi States in Nigeria as well as assess their coordination in both states and Abuja where national coordination of cholera is based. Methods: We conducted a cross-sectional survey using a face-to-face structured questionnaire to collect data on multi-stranded cholera interventions and their respective indicators in HCFs. We generated scores to describe the resource availability of each cholera intervention and categorised them as follows: 0–50 (low), 51–70 (moderate), 71–90 (high), and over 90 (excellent). Further, we defined an HCF with a high capacity to implement a cholera intervention as one with a score equal to or above the average intervention score. Results: One hundred and twenty HCFs (55 in Adamawa and 65 in Bauchi) were surveyed in March 2021, most of which were primary healthcare centres (83%; 99/120). In both states, resource availability for WASH indicators had high to excellent median scores; surveillance and community engagement indicators had low median scores. Median resource availability scores for case management indicators ranged from low to moderate. Coordination of cholera interventions in Adamawa State and Abuja was high but low in Bauchi State. Overall, HCF capacity to implement multi-stranded cholera interventions was high, though higher in Adamawa State than in Bauchi State. 
Conclusions: The study found marked variation in HCF resource availability and capacity across locations and cholera interventions, and identified the interventions that should be prioritised for strengthening: surveillance and laboratory, case management, and community engagement. The findings support adopting a differential approach to strengthening cholera interventions for better preparedness and response to cholera outbreaks.
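The scoring rules described in the methods can be expressed directly in code. This sketch assumes only the banding and the "high capacity" definition stated above; the function names are illustrative:

```python
def band_score(score):
    """Map a resource-availability score to the bands used in the
    study: 0-50 low, 51-70 moderate, 71-90 high, over 90 excellent."""
    if not 0 <= score <= 100:
        raise ValueError("score must lie in [0, 100]")
    if score <= 50:
        return "low"
    if score <= 70:
        return "moderate"
    if score <= 90:
        return "high"
    return "excellent"

def high_capacity(score, intervention_mean):
    """An HCF has 'high capacity' for an intervention when its score
    meets or exceeds the average score for that intervention."""
    return score >= intervention_mean

print(band_score(65), high_capacity(72.0, 68.4))  # moderate True
```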

    Impact of water and sanitation services on cholera outbreaks in sub-Saharan Africa

    While most parts of the world appear to have controlled cholera, sub-Saharan Africa still suffers from cholera outbreaks and struggles to restrain their incidence. Recent research attributes 83% of cholera deaths between 2000 and 2015 to this region. Poor water, sanitation and hygiene (WASH) services are among the main risk factors contributing to the public health burden of cholera. People living in close proximity to one another, in environments with poor hygiene conditions and little access to clean water, help explain how cholera takes root in non-coastal areas. The combination of these factors with the vulnerability of surface and groundwater resources to faecal contamination can favour the onset and propagation of outbreaks. This study investigated the correlation between cholera rates per population and the lack of basic drinking water and sanitation services in sub-Saharan African countries where incident cases of cholera have been regularly reported to the World Health Organization (WHO) since 1991.
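Correlations of this kind are usually computed on ranks, since country-level cholera rates are heavily skewed. A minimal Spearman rank-correlation sketch with hypothetical figures (not the study's data):

```python
import numpy as np

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks,
    robust to the skewed distributions typical of disease rates.
    (Assumes no tied values; ties need averaged ranks.)"""
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    return np.corrcoef(rx, ry)[0, 1]

# illustrative numbers only: share of population lacking basic water
# services (%) vs cholera cases per 100,000, six hypothetical countries
lacking_water = np.array([10, 25, 40, 55, 60, 75])
cholera_rate = np.array([0.2, 1.1, 3.5, 8.0, 7.2, 15.0])
rho = spearman(lacking_water, cholera_rate)
print(round(rho, 2))  # strong positive rank correlation in this toy data
```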

    Coalescing disparate data sources for the geospatial prediction of mosquito abundance, using Brazil as a motivating case study

    One of the barriers to performing geospatial surveillance of mosquito occupancy or infestation anywhere in the world is the paucity of primary entomologic survey data geolocated at a residential property level and matched to important risk factor information (e.g., anthropogenic, environmental, and climate) that enables the spatial risk prediction of mosquito occupancy or infestation. Such data are invaluable for academics, policy makers, and public health program managers operating in low-resource settings in Africa, Latin America, and Southeast Asia, where mosquitoes are typically endemic. The reality is that such data remain elusive in these low-resource settings and, where available, high-quality data that include both individual and spatial characteristics to inform the geospatial description and risk patterning of infestation remain rare. Many reliable online sources of open-source spatial data can be used to address this paucity. The aims of this article are therefore threefold: (1) to highlight where these reliable open-source data can be acquired and how they can be used as risk factors for making spatial predictions of mosquito occupancy in general; (2) to use Brazil as a case study to demonstrate how these datasets can be combined to predict the presence of arboviruses through ecological niche modeling using the maximum entropy algorithm; and (3) to discuss the benefits of using bespoke applications beyond these open-source online data sources, demonstrating how they can become the new “gold standard” approach for gathering primary entomologic survey data. The scope of this article was mainly limited to a Brazilian context because it builds on an existing partnership with academics and stakeholders from environmental surveillance agencies in the states of Pernambuco and Paraiba. 
The analysis presented in this article was also limited to a specific mosquito species, i.e., Aedes aegypti, due to its endemic status in Brazil
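Ecological niche modeling with the maximum entropy (Maxent) algorithm contrasts presence records against background points sampled from the study region. As a rough stand-in for that idea, the sketch below fits a presence/background logistic classifier on synthetic data; it is illustrative only and is not the Maxent software, which adds feature transformations and a different normalisation:

```python
import numpy as np

def fit_presence_background(X_pres, X_back, lr=0.1, steps=2000, l2=1e-3):
    """Logistic regression by gradient descent with presence records
    labelled 1 and random background points labelled 0 -- a minimal
    presence/background model in the spirit of Maxent."""
    X = np.vstack([X_pres, X_back])
    X = np.hstack([np.ones((len(X), 1)), X])        # intercept column
    y = np.r_[np.ones(len(X_pres)), np.zeros(len(X_back))]
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w -= lr * (X.T @ (p - y) / len(y) + l2 * w)  # L2-regularised step
    return w

def suitability(w, X):
    """Relative habitat suitability for new covariate values."""
    Xb = np.hstack([np.ones((len(X), 1)), X])
    return 1.0 / (1.0 + np.exp(-Xb @ w))

# synthetic example: one covariate (say, a temperature index) where
# presences cluster at high values and background is sampled uniformly
rng = np.random.default_rng(1)
pres = rng.normal(loc=1.5, scale=0.5, size=(200, 1))
back = rng.uniform(-3, 3, size=(1000, 1))
w = fit_presence_background(pres, back)
hot, cold = suitability(w, np.array([[1.5], [-1.5]]))
print(hot > cold)  # suitability is higher where presences concentrate
```

In practice the covariate columns would come from the open-source spatial layers the article describes (climate, environmental, anthropogenic), stacked per grid cell.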

    An Evaluation of the OpenWeatherMap API versus INMET Using Weather Data from Two Brazilian Cities: Recife and Campina Grande

    Certain weather conditions are closely related to increases in the populations of various mosquitoes. To predict the burden of mosquito populations in the Global South, it is imperative to integrate weather-related risk factors into such predictive models. Many online open-source weather platforms provide historical, current and future weather forecasts that can be used for general predictions, and these electronic sources serve as an alternative source of weather data when physical weather stations are inaccessible (or inactive). Before using data from such online sources, however, it is important to assess their accuracy against some baseline measure. In this paper, we therefore evaluated the accuracy and suitability of forecasts of two weather parameters, temperature and humidity, from the OpenWeatherMap API (an online weather platform) and compared them with actual measurements collected from Brazilian weather stations (INMET). The evaluation focused on two Brazilian cities, Recife and Campina Grande. The intention is to prepare an early warning model that will harness data from the OpenWeatherMap API for mosquito prediction.
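An evaluation like this typically reduces to a few agreement statistics between the forecast series and the station series. A minimal sketch with illustrative temperature values (not INMET or OpenWeatherMap data):

```python
import numpy as np

def forecast_accuracy(observed, forecast):
    """Baseline agreement measures for comparing an online forecast
    against station measurements: mean bias (signed error), mean
    absolute error, and root-mean-square error."""
    observed, forecast = np.asarray(observed, float), np.asarray(forecast, float)
    err = forecast - observed
    return {"bias": err.mean(),
            "mae": np.abs(err).mean(),
            "rmse": np.sqrt((err ** 2).mean())}

# illustrative daily temperatures (degC): station values vs API forecasts
station = [28.1, 27.5, 29.0, 30.2, 28.8]
api = [28.6, 27.9, 28.4, 30.9, 29.1]
print(forecast_accuracy(station, api))
```

Bias reveals a systematic warm or cold offset that could be corrected before feeding the forecasts into a mosquito model, while MAE and RMSE summarise day-to-day accuracy (RMSE penalising occasional large misses more heavily).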

    Temporal and Spatiotemporal Arboviruses Forecasting by Machine Learning: A Systematic Review

    Arboviruses are a group of diseases transmitted by arthropod vectors. They are part of the Neglected Tropical Diseases, which pose several public health challenges for countries around the world. Arbovirus dynamics are governed by a combination of climatic, environmental, and human mobility factors. Arbovirus prediction models can support decision-making by public health agents. In this study, we report a systematic literature review to identify arbovirus prediction models, as well as models of their vector dynamics. To carry out this review, we searched reputable scientific databases, including IEEE Xplore, PubMed, Science Direct, Springer Link, and Scopus, for studies published between 2015 and 2020, using a search string. A total of 429 articles were returned; after filtering by exclusion and inclusion criteria, 139 were included. Through this systematic review, it was possible to identify the challenges present in the construction of arbovirus prediction models, as well as the existing gap in the construction of spatiotemporal models.

    Exploring barriers to guideline implementation for prescription of surgical antibiotic prophylaxis in Nigeria.

    Background: In Nigeria, the prescription of surgical antibiotic prophylaxis for prevention of surgical site infection tends to be driven by local policy rather than by published guidelines (e.g. WHO and Sanford). Objectives: To triangulate three datasets and understand key barriers to implementation using a behavioural science framework. Methods: Surgeons (N = 94) from three teaching hospitals in Nigeria participated in an online survey and in focus group discussions about barriers to implementation. The theoretical domains framework (TDF) was used to structure question items and interview schedules. A subgroup (N = 20) piloted a gamified decision support app over the course of 6 months and reported barriers at the point of care. Results: Knowledge of the guidelines, and intention to implement them in practice, was high. Key barriers to implementation related to environmental context and resources, and to concern over the potential consequences of implementing the recommendations within the Nigerian context; these barriers are likely to apply in similar settings in low-to-middle-income countries. Conclusions: The environmental context and limited resources of Nigerian hospitals currently present a significant barrier to implementation of the WHO and Sanford guidelines. Research and data collected from the local context must directly inform the writing of future international guidelines to increase rates of implementation.